History of the United States
The United States began as British colonies along the eastern coast of North America.
After the French and Indian War between Britain and France, Britain began taxing the colonies.
The colonies demanded self-government, so they united and fought a war of independence against Britain.
With French help they won the war, and the new nation then expanded westward across the continent.